# Bilingual Optimization (Chinese-English)
MiniCPM3 4B
Apache-2.0
MiniCPM3-4B is the third-generation model in the MiniCPM series. Its overall performance surpasses Phi-3.5-mini-Instruct and GPT-3.5-Turbo-0125 and is comparable to several recent 7B–9B-scale models.
Large Language Model
Transformers · Supports Multiple Languages

openbmb

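Since the card above tags MiniCPM3-4B with Transformers support, a short loading sketch may be helpful. This is a minimal sketch assuming the model is published under the Hugging Face-style repository ID openbmb/MiniCPM3-4B and ships custom modeling code (hence trust_remote_code=True); neither detail is stated in the listing, so adjust the ID for the hub you actually download from.

```python
# Minimal sketch: load MiniCPM3-4B with the transformers library.
# The repository ID "openbmb/MiniCPM3-4B" is an assumption, not taken from this listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM3-4B"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision keeps the 4B model on a single GPU
    device_map="auto",
    trust_remote_code=True,       # assumed: the model ships custom modeling code
)

# Bilingual prompt, since the model is optimized for Chinese and English.
prompt = "请用中文和英文各写一句问候。"  # "Write one greeting in Chinese and one in English."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```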
Baichuan 7B
Baichuan-7B is an open-source large-scale pre-trained language model developed by Baichuan Intelligence, based on the Transformer architecture with 7 billion parameters. Trained on a bilingual Chinese-English corpus, it supports a context window of 4096 tokens.
Large Language Model
Transformers · Supports Multiple Languages

baichuan-inc

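The Baichuan-7B card above notes a 4096-token context window, which is easy to check before sending a prompt. The small sketch below counts tokens for a bilingual string; the repository ID baichuan-inc/Baichuan-7B and the trust_remote_code requirement are assumptions, not details from the listing.

```python
# Small sketch: verify a bilingual prompt fits Baichuan-7B's 4096-token window.
# The repository ID "baichuan-inc/Baichuan-7B" is an assumption, not taken from this listing.
from transformers import AutoTokenizer

MAX_CONTEXT = 4096  # context window stated in the model card
tokenizer = AutoTokenizer.from_pretrained(
    "baichuan-inc/Baichuan-7B", trust_remote_code=True
)

text = "Baichuan-7B 是一个中英双语预训练模型。It is trained on a bilingual Chinese-English corpus."
n_tokens = len(tokenizer(text)["input_ids"])
print(f"{n_tokens} tokens; fits in context window: {n_tokens <= MAX_CONTEXT}")
```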
Ziya LLaMA 13B Pretrain V1
GPL-3.0
A large-scale pre-trained model with 13 billion parameters based on the LLaMA architecture. Its tokenizer is optimized for Chinese, and it has completed 110 billion tokens of incremental pre-training on Chinese and English data, significantly improving Chinese generation and comprehension capabilities.
Large Language Model
Transformers · Supports Multiple Languages

IDEA-CCNL